# Multilingual Medical QA
## MMed-Llama 3 8B
MMed-Llama 3 is a multilingual medical foundation model with 8 billion parameters, based on the Llama 3 architecture. It has been further pre-trained on the MMedC corpus to strengthen its medical domain knowledge.
Tags: Large Language Model · Transformers · Supports Multiple Languages

Author: Henrychur · 1,763 downloads · 25 likes
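
As a rough usage sketch, the model can be loaded for medical question answering with the Transformers library mentioned in the tags. The repository id `Henrychur/MMed-Llama-3-8B` and the example prompt below are assumptions for illustration, not details taken from the listing.

```python
# Minimal sketch (assumption: the repo id "Henrychur/MMed-Llama-3-8B" is inferred
# from the author and model name above; the prompt is illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Henrychur/MMed-Llama-3-8B"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # reduce memory on GPUs with bf16 support
    device_map="auto",           # requires the accelerate package
)

# MMed-Llama 3 is a further pre-trained foundation model, so plain text
# completion is used here rather than a chat template.
prompt = "Question: What are the first-line treatments for type 2 diabetes?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```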
## BioMistral 7B
License: Apache-2.0
BioMistral is an open-source large language model optimized for the medical domain. It is based on the Mistral architecture, further pre-trained on open-access text from PubMed Central, and supports multilingual medical question-answering tasks.
Tags: Large Language Model · Transformers · Supports Multiple Languages

Author: BioMistral · 22.59k downloads · 428 likes
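
Similarly, here is a minimal sketch of multilingual medical QA with BioMistral through a Transformers text-generation pipeline. The repository id `BioMistral/BioMistral-7B` and the French example prompt are assumptions made for illustration.

```python
# Minimal sketch (assumption: the repo id "BioMistral/BioMistral-7B" is inferred
# from the author and model name listed above).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="BioMistral/BioMistral-7B",  # assumed repository id
    device_map="auto",                 # requires the accelerate package
)

# French prompt to exercise the multilingual capability:
# "What are the common symptoms of hypertension?"
prompt = "Quels sont les symptômes courants de l'hypertension ?"
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```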